Importance Gaussian Quadrature
Authors
Abstract
Importance sampling (IS) and numerical integration methods are usually employed for approximating moments of complicated target distributions. In its basic procedure, the IS methodology randomly draws samples from a proposal distribution and weights them accordingly, accounting for the mismatch between the target and the proposal. In this work, we present a general framework of numerical integration techniques inspired by the IS methodology. The framework can also be seen as an incorporation of deterministic quadrature rules into IS methods, reducing the error of the estimators by several orders of magnitude in several problems of interest. The proposed approach extends the range of applicability of Gaussian quadrature rules. For instance, the IS perspective allows us to use Gauss-Hermite rules in problems where the integrand does not involve a Gaussian distribution and, even more, when the integrand can only be evaluated up to a normalizing constant, as is the case in Bayesian inference. The novel perspective makes use of recent advances in the multiple IS (MIS) and adaptive IS (AIS) literatures, and incorporates the proposed framework within a wider family of methods where the proposals are combined and iteratively adapted. We analyze the convergence of the proposed algorithms and provide some representative examples showing the superiority of the proposed schemes in terms of performance.
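To make the core idea concrete, here is a minimal sketch (not the authors' exact algorithm) of how Gauss-Hermite nodes can replace random samples in a self-normalized importance-sampling estimator, so that a posterior mean can be approximated even when the target is only known up to a normalizing constant. The toy target, proposal parameters, and node count below are illustrative assumptions.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss  # probabilists' Gauss-Hermite rule

# Unnormalized log-target: a toy Bayesian posterior known only up to a constant
# (hypothetical choice: standard Gaussian prior times a non-Gaussian likelihood).
def log_target(x):
    log_prior = -0.5 * x**2                              # N(0, 1) prior, up to a constant
    log_lik = -0.5 * (np.tanh(x) - 0.5)**2 / 0.5**2      # toy non-Gaussian likelihood
    return log_prior + log_lik

# Gaussian proposal q(x) = N(mu, sigma^2); the deterministic "samples" are the
# Gauss-Hermite nodes transformed to the proposal's location and scale.
mu, sigma, n_nodes = 0.0, 1.0, 20
u, quad_w = hermegauss(n_nodes)        # nodes/weights for the weight exp(-u^2 / 2)
x = mu + sigma * u

# Importance-style weights: (unnormalized target) / proposal at each node,
# multiplied by the quadrature weight. Constants cancel after self-normalization.
log_q = -0.5 * ((x - mu) / sigma)**2   # proposal log-density up to a constant
log_r = log_target(x) - log_q
w = quad_w * np.exp(log_r - log_r.max())
w /= w.sum()

posterior_mean = np.sum(w * x)         # moment estimate without knowing the normalizing constant
print("approximate posterior mean:", posterior_mean)
```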
Similar Papers
Anti-Gaussian quadrature formulas
An anti-Gaussian quadrature formula is an (n+1)-point formula of degree 2n−1 which integrates polynomials of degree up to 2n+1 with an error equal in magnitude but of opposite sign to that of the n-point Gaussian formula. Its intended application is to estimate the error incurred in Gaussian integration by halving the difference between the results obtained from the two formulas. We show tha...
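As a concrete illustration of the halved-difference error estimate, here is a small sketch for the Legendre weight on [-1, 1], assuming Laurie's construction in which the last recurrence coefficient of the (n+1)-point Jacobi matrix is doubled; the test integrand and the value of n are arbitrary choices for illustration.

```python
import numpy as np

def anti_gauss_legendre(n):
    # (n+1)-point anti-Gaussian rule for the Legendre weight on [-1, 1]:
    # build the (n+1) x (n+1) Jacobi matrix with the last recurrence
    # coefficient beta_n doubled, then apply the Golub-Welsch procedure.
    k = np.arange(1, n + 1)
    beta = k**2 / (4.0 * k**2 - 1.0)     # beta_1 .. beta_n for Legendre polynomials
    beta[-1] *= 2.0                      # the anti-Gaussian modification
    J = np.diag(np.sqrt(beta), 1) + np.diag(np.sqrt(beta), -1)
    nodes, vecs = np.linalg.eigh(J)
    weights = 2.0 * vecs[0, :]**2        # mu_0 = integral of the weight = 2
    return nodes, weights

f, exact = np.cos, 2.0 * np.sin(1.0)     # smooth test integrand and its integral on [-1, 1]
n = 5
xg, wg = np.polynomial.legendre.leggauss(n)
xa, wa = anti_gauss_legendre(n)
G, A = np.sum(wg * f(xg)), np.sum(wa * f(xa))

# Halving the difference of the two rules estimates the Gaussian error,
# and their average is typically closer to the true integral than either rule.
print("estimated error of G:", 0.5 * (A - G))
print("true error of G     :", exact - G)
print("averaged rule       :", 0.5 * (A + G), "vs exact:", exact)
```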
Trigonometric and Gaussian Quadrature
Some relationships are established between trigonometric quadrature and various classical quadrature formulas. In particular Gauss-Legendre quadrature is shown to be a limiting case of trigonometric quadrature. In an earlier paper [1] it was noted that there exist trigonometric and exponential analogs of Gaussian quadrature formulas. We now extend those results to show several interesting featu...
Gaussian Quadrature for Kernel Features
Kernel methods have recently attracted resurgent interest, showing performance competitive with deep neural networks in tasks such as speech recognition. The random Fourier features map is a technique commonly used to scale up kernel machines, but employing the randomized feature map means that O(ε⁻²) samples are required to achieve an approximation error of at most ε. We investigate some alter...
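For context, the randomized baseline that this line of work seeks to improve on can be sketched as follows: random Fourier features for the RBF (Gaussian) kernel, whose error decays at the O(ε⁻²) sampling rate mentioned above. The kernel bandwidth, feature count, and data below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def random_fourier_features(X, n_features, gamma, rng):
    # Random Fourier features for the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2):
    # sample frequencies from the kernel's spectral density (a Gaussian) and map
    # each point to scaled cosines; inner products of the features then
    # approximate kernel evaluations.
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, n_features=2000, gamma=0.5, rng=rng)

approx = Z @ Z.T                                                   # approximate Gram matrix
exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :])**2).sum(-1))
print("max abs error:", np.abs(approx - exact).max())
```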
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2021
ISSN: 1053-587X, 1941-0476
DOI: https://doi.org/10.1109/tsp.2020.3045526